# 32K long text processing

## Kanana 1.5 8b Instruct 2505 GGUF
License: Apache-2.0 | Author: Mungert | Tags: Large Language Model, Transformers, Supports Multiple Languages

Kanana 1.5 is the new version of the Kanana model series, with significant improvements in coding, mathematics, and function calling. It handles inputs of up to 32K tokens natively, and up to 128K tokens when YaRN is applied.
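The 32K-to-128K extension mentioned above relies on YaRN rope scaling. Below is a minimal, hypothetical sketch of how such an extension is commonly configured through the Hugging Face transformers `rope_scaling` convention; the model id and the exact scaling keys are assumptions and may differ from what this particular release expects.

```python
# Hypothetical sketch (assumed repo id and scaling keys): loading a 32K
# model with YaRN rope scaling so it can attend over roughly 128K positions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "kakaocorp/kanana-1.5-8b-instruct-2505"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    # Keyword overrides are forwarded to the model config; recent
    # transformers versions use "rope_type", older ones used "type".
    rope_scaling={
        "rope_type": "yarn",
        "factor": 4.0,                            # 32K x 4 = 128K positions
        "original_max_position_embeddings": 32768,
    },
    max_position_embeddings=131072,
)
```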
## Qwen3 4B INT8
License: Apache-2.0 | Author: zhiqing | Tags: Large Language Model, Transformers

A 4B-parameter large language model packaged for the Hugging Face transformers library, supporting text generation, thinking-mode switching, tool invocation, and long-text processing.
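Thinking-mode switching in Qwen3 models is normally exposed through the chat template. The sketch below assumes this quantized repository keeps the standard Qwen3 template; the model id is a placeholder, not taken from the card.

```python
# Hypothetical sketch: toggling Qwen3's thinking mode via the chat template.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen3-4B"  # placeholder; substitute the INT8 repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [{"role": "user", "content": "Summarize YaRN in one sentence."}]

# enable_thinking=True emits a reasoning block before the answer;
# False switches the model to direct answers.
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
    enable_thinking=False,
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
))
```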
## Kanana 1.5 8b Base
License: Apache-2.0 | Author: kakaocorp | Tags: Large Language Model, Transformers, Supports Multiple Languages

Kanana 1.5 is a bilingual large language model developed by Kakao Corporation. It supports English and Korean, brings significant improvements in programming, mathematics, and function calling, and natively supports a 32K-token context length.
## Falcon3
License: Apache-2.0 | Author: cortexso | Tags: Large Language Model

Falcon3-10B-Instruct is an open-source foundational model from the Falcon3 series. With 10 billion parameters, it specializes in high-quality instruction-following tasks and supports multilingual processing with a context length of up to 32K tokens.
## Midnight Miqu 103B V1.5
Author: FluffyKaeloky | Tags: Large Language Model, Transformers

A 103B-parameter merged model based on Miqu, supporting a 32K context length; for personal use only.
## PULI LlumiX 32K
Author: NYTK | Tags: Large Language Model, Transformers, Supports Multiple Languages

PULI LlumiX 32K is a large language model based on LLaMA-2-7B-32K, continually pre-trained on Hungarian and English datasets and supporting a 32K context length.
## Midnight Miqu 103B V1.0
License: Other | Author: sophosympatheia | Tags: Large Language Model, Transformers

A 103B-parameter merged model based on the leaked Miqu model, supporting a 32K context length.